Ab Initio, Big Data, Informatica, Tableau, Data Architect, Cognos, MicroStrategy, Healthcare Business Analysts, Cloud, etc.
at Exusia
Role - MIS Executive
Experience - 2 to 4 years
Job Location - Ahmedabad
- Maintenance of existing management information systems.
- Generate and distribute management reports in an accurate and timely manner.
- Use Advanced Excel capabilities, including pivot tables, look-ups, complex formulas and graphing to streamline business processes.
- Understand complex data, analyze it, and build reports and dashboards.
- Extract the data from the designated portal and update it.
- Provide recommendations to update current MIS to improve reporting efficiency and consistency.
- Perform data analysis for generating reports on a periodic basis.
- Provide strong reporting and analytical information support to the management team.
- Generate both periodic and ad hoc reports as required.
- Analyze business information to identify process improvements for increasing business efficiency and effectiveness.
- Provide support and assistance to management in issue troubleshooting and resolution.
- Handle database management using Advanced Excel tools and MS Access.
- Should be proficient with Advanced Excel features such as pivot tables, lookups, INDEX/MATCH, and conditional formatting (a pandas sketch of the same pivot/lookup flow follows this posting).
- Experience in Tableau, dashboard creation, and data crunching and extraction.
- Qualification: Bachelor's degree with 2 to 4 years of experience.
- 2 to 4 years of experience in MIS and dashboarding is a must.
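For illustration only: the posting centers on Excel pivot tables, lookups, and periodic report generation. A pandas equivalent (a swapped-in tool, not the Excel workflow named above, with hypothetical file and column names) sketches the same pivot/lookup flow:

```python
# Illustrative sketch of the pivot/lookup reporting described above, done in
# pandas rather than Excel. File names and columns are hypothetical.
import pandas as pd

# Load the raw extract pulled from the designated portal.
sales = pd.read_csv("sales.csv")          # assumed columns: date, region_code, product, amount
regions = pd.read_csv("region_map.csv")   # assumed columns: region_code, region_name

# Lookup-style step: attach the region name to each sale.
sales = sales.merge(regions, on="region_code", how="left")

# Pivot-table-style summary: monthly totals per region and product.
sales["month"] = pd.to_datetime(sales["date"]).dt.to_period("M")
report = pd.pivot_table(
    sales,
    index=["region_name", "product"],
    columns="month",
    values="amount",
    aggfunc="sum",
    fill_value=0,
)

# Write the periodic report out for distribution.
report.to_excel("monthly_mis_report.xlsx")
```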
Informatica PowerCenter, Informatica Change Data Capture, Azure SQL, Azure Data Lake
Job Description
- Minimum of 15 years of experience with Informatica ETL and database technologies.
- Experience with Azure database technologies, including Azure SQL Server and Azure Data Lake; exposure to change data capture technology.
- Lead and guide development of an Informatica-based ETL architecture; develop solutions in a highly demanding environment and provide hands-on guidance to other team members.
- Handle complex ETL requirements and design, and implement an Informatica-based ETL solution that fulfills stringent performance requirements.
- Collaborate with product development teams and senior designers to develop architectural requirements for the solution; assess requirements for completeness and accuracy, and determine whether they are actionable for the ETL team.
- Conduct impact assessments and determine the size of effort based on requirements; develop full SDLC project plans to implement the ETL solution and identify resource requirements.
- Take an active, leading role in shaping and enhancing the overall Informatica ETL architecture; identify, recommend, and implement ETL process and architecture improvements.
- Assist with and verify the design of the solution and the production of all design-phase deliverables.
- Manage the build phase and quality-assure code to ensure it fulfills requirements and adheres to the ETL architecture.
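Informatica PowerCenter and its CDC option are configured through their own tooling rather than written as code, but the underlying change-data-capture pattern the posting refers to, applying only changed rows to an Azure SQL target, can be sketched roughly. The snippet below is a hedged illustration using pyodbc and a hypothetical staging table of captured changes, not Informatica's actual mechanism:

```python
# Rough illustration of the CDC "apply changed rows" idea against Azure SQL,
# using pyodbc and a T-SQL MERGE. The connection string, table names, columns,
# and op codes (I/U/D) are hypothetical placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={ODBC Driver 18 for SQL Server};"
    "SERVER=myserver.database.windows.net;DATABASE=mydb;"
    "UID=etl_user;PWD=..."  # placeholder credentials
)

merge_sql = """
MERGE dbo.customer AS tgt
USING staging.customer_changes AS src
    ON tgt.customer_id = src.customer_id
WHEN MATCHED AND src.op = 'U' THEN
    UPDATE SET tgt.name = src.name, tgt.email = src.email
WHEN MATCHED AND src.op = 'D' THEN
    DELETE
WHEN NOT MATCHED AND src.op = 'I' THEN
    INSERT (customer_id, name, email) VALUES (src.customer_id, src.name, src.email);
"""

# Using the connection as a context manager commits the transaction on success.
with conn:
    conn.execute(merge_sql)
```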
Job Description:
The data science team is responsible for solving business problems with complex data. Data complexity could be characterized in terms of volume, dimensionality, and multiple touchpoints/sources. We understand the data, ask fundamental first-principles questions, and apply our analytical and machine learning skills to solve the problem in the best way possible.
Our ideal candidate
The role would be a client facing one, hence good communication skills are a must.
The candidate should have the ability to communicate complex models and analysis in a clear and precise manner.
The candidate would be responsible for:
- Comprehending business problems properly: what to predict, how to build the DV (dependent variable), what value addition he/she is bringing to the client, etc.
- Understanding and analyzing large, complex, multi-dimensional datasets and building features relevant to the business
- Understanding the math behind algorithms and choosing one over another
- Understanding approaches like stacking and ensembling, and applying them correctly to increase accuracy (a minimal sketch follows this list)
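Since the role calls out stacking and ensembling explicitly, here is a minimal scikit-learn sketch (synthetic data and illustrative estimator choices, not a prescription for any particular client problem):

```python
# Minimal stacking sketch: base learners' out-of-fold predictions feed a
# final estimator. Data and model choices are illustrative only.
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier, StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.svm import SVC

X, y = make_classification(n_samples=2000, n_features=20, random_state=0)

base_learners = [
    ("rf", RandomForestClassifier(n_estimators=200, random_state=0)),
    ("svc", SVC(probability=True, random_state=0)),
]

# Logistic regression learns how to weight the base learners' predictions.
stack = StackingClassifier(estimators=base_learners, final_estimator=LogisticRegression())

print("stacked CV accuracy:", cross_val_score(stack, X, y, cv=5).mean())
```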
Desired technical requirements
- Proficiency with Python and the ability to write production-ready code.
- Experience in PySpark, machine learning, and deep learning
- Big data experience (e.g., familiarity with Spark or Hadoop) is highly preferred
- Familiarity with SQL or other databases.
Power BI Engineer
Company Overview:
At Codvo, software and people transformations go hand-in-hand. We are a global empathy-led technology services company. Product innovation and mature software engineering are part of our core DNA. Respect, Fairness, Growth, Agility, and Inclusiveness are the core values that we aspire to live by each day.
We continue to expand our digital strategy, design, architecture, and product management capabilities to offer expertise, outside-the-box thinking, and measurable results.
What we're looking for :
- You should have strong technical and analytical skills, particularly in SQL Server and in reporting tools such as Tableau, Power BI, SSRS, and .NET.
- You have experience in OLAP, UI/UX, and dashboard building.
- You should have enough experience to properly understand the project deliverables.
- You should possess excellent communication skills.
Responsibilities:
- You will be responsible for the respective tasks assigned in the project.
- You will be responsible for delivering with proper quality, on time, and within cost, adhering to the industry standards defined for the project.
- You will be involved in client interaction.
Senior SRE - Acceldata (IC3 Level)
About the Job
You will join a team of highly skilled engineers who are responsible for delivering Acceldata’s support services. Our Site Reliability Engineers are trained to be active listeners and demonstrate empathy when customers encounter product issues. In our fun and collaborative environment, Site Reliability Engineers develop strong business, interpersonal, and technical skills to deliver high-quality service to our valued customers.
When you arrive for your first day, we’ll want you to have:
- Solid troubleshooting skills: the ability to use a logical, systematic search for the source of a problem in a failed product, process, machine, or system, solve it, and make the product or process operational again
- A strong ability to understand the feelings of our customers as we empathize with them on the issue at hand
- A strong desire to increase your product and technology skill set and your confidence in supporting our products, so you can help our customers succeed
In this position you will…
- Provide Support Services to our Gold & Enterprise customers using our flagship Acceldata Pulse, Flow & Torch product suites. This may include assistance provided during the engineering and operations of distributed systems as well as responses for mission-critical systems and production customers.
- Demonstrate the ability to actively listen to customers and show empathy to the customer’s business impact when they experience issues with our products
- Participate in the queue management and coordination process by owning customer escalations and managing the unassigned queue.
- Be involved with and work on other support-related activities, such as performing POCs and assisting with onboarding deployments of Acceldata and Hadoop distribution products.
- Triage, diagnose and escalate customer inquiries when applicable during their engineering and operations efforts.
- Collaborate and share solutions with both customers and the Internal team.
- Investigate product related issues both for particular customers and for common trends that may arise
- Study and understand critical system components and large cluster operations
- Differentiate between issues that arise in operations, user code, or product
- Coordinate enhancement and feature requests with product management and Acceldata engineering team.
- Be flexible about working in shifts.
- Participate in a Rotational weekend on-call roster for critical support needs.
- Participate as a designated or dedicated engineer for specific customers. Aspects of this engagement translate to building long-term, successful relationships with customers, leading weekly status calls, and making occasional visits to customer sites.
In this position, you should have…
- A strong desire and aptitude to become a well-rounded support professional. Acceldata Support considers the service we deliver as our core product.
- A positive attitude towards feedback and continual improvement
- A willingness to give direct feedback to and partner with management to improve team operations
- A tenacity to bring calm and order to the often stressful situations of customer cases
- A mental capability to multi-task across many customer situations simultaneously
- Bachelor's degree in Computer Science or Engineering, or equivalent experience; a Master's degree is a plus
- 2+ years of experience with at least one of the following cloud platforms: Amazon Web Services (AWS), Microsoft Azure, or Google Cloud Platform (GCP), including experience managing and supporting cloud infrastructure on any of the three. Knowledge of Kubernetes and Docker is also a must.
- Strong troubleshooting skills (for example, TCP/IP, DNS, file systems, load balancing, databases, Java)
- Excellent communication skills in English (written and verbal)
- Prior enterprise support experience in a technical environment strongly preferred
Strong Hands-on Experience Working With Or Supporting The Following
- 8-12 years of Experience with a highly-scalable, distributed, multi-node environment (50+ nodes)
- Hadoop operations, including ZooKeeper, HDFS, YARN, Hive, and related components like the Hive metastore, Cloudera Manager/Ambari, etc.
- Authentication and security configuration and tuning (KNOX, LDAP, Kerberos, SSL/TLS, second priority: SSO/OAuth/OIDC, Ranger/Sentry)
- Java troubleshooting, e.g., collection and evaluation of jstacks, heap dumps
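On the jstack/heap-dump point above, a small sketch of automating thread-dump collection might look like the following (assuming the JDK's jstack tool is on PATH; the pid and output directory are placeholders):

```python
# Sketch: collect a few jstack thread dumps for later analysis.
# Assumes the JDK's jstack binary is on PATH; pid and paths are placeholders.
import subprocess
import time
from pathlib import Path

def collect_thread_dumps(pid: int, count: int = 3, interval_s: int = 10,
                         out_dir: str = "thread_dumps") -> None:
    Path(out_dir).mkdir(exist_ok=True)
    for i in range(count):
        # jstack <pid> prints the JVM's current thread stacks to stdout.
        result = subprocess.run(["jstack", str(pid)], capture_output=True, text=True, check=True)
        Path(out_dir, f"jstack_{pid}_{i}.txt").write_text(result.stdout)
        time.sleep(interval_s)

# Example: collect_thread_dumps(12345)
```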
You might also have…
- Linux, NFS, Windows, including application installation, scripting, basic command line
- Docker and Kubernetes configuration and troubleshooting, including Helm charts, storage options, logging, and basic kubectl CLI
- Experience working with scripting languages (Bash, PowerShell, Python)
- Working knowledge of application, server, and network security management concepts
- Familiarity with virtual machine technologies
- Knowledge of databases like MySQL and PostgreSQL
- Certification on any of the leading cloud providers (AWS, Azure, GCP) and/or Kubernetes is a big plus
The right person in this role has an opportunity to make a huge impact at Acceldata and add value to our future decisions. If this position has piqued your interest and you have what we described - we invite you to apply! An adventure in data awaits.
Learn more at https://www.acceldata.io/about-us
Company Name: Intraedge Technologies Ltd (https://intraedge.com/)
Type: Permanent, Full time
Location: Any
- A Bachelor’s degree in computer science, computer engineering, or another technical discipline, or equivalent work experience
- 4+ years of software development experience
- 4+ years of experience with programming languages and frameworks: Python, Spark, Scala, Hadoop, Hive
- Demonstrated experience with Agile or other rapid application development methods
- Demonstrated experience with object-oriented design and coding.
Please mail your resume to poornimakattherateintraedgedotcom along with your NP (notice period), how soon you can join, ECTC, availability for interview, and location.
Python + Data Scientist:
• Build data-driven models to understand the characteristics of engineering systems
• Train, tune, validate, and monitor predictive models
• Sound knowledge of statistics
• Experience in developing data processing tasks using PySpark, such as reading, merging, enrichment, and loading of data from external systems to target data destinations (a minimal sketch follows this list)
• Working knowledge of Big Data and/or Hadoop environments
• Experience creating CI/CD pipelines using Jenkins or similar tools
• Practiced in eXtreme Programming (XP) disciplines
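As the bullets above mention PySpark-based reading, merging, enrichment, and loading, here is a minimal, hedged sketch of that flow (the paths, schemas, join key, and parquet target are assumptions, not a specific pipeline):

```python
# Minimal PySpark read -> merge -> enrich -> load sketch.
# Source paths, column names, and the target location are illustrative only.
from pyspark.sql import SparkSession, functions as F

spark = SparkSession.builder.appName("example-etl").getOrCreate()

# Read from two (hypothetical) external sources.
orders = spark.read.option("header", True).csv("s3://bucket/orders/")
customers = spark.read.parquet("s3://bucket/customers/")

# Merge and enrich: join on a shared key and derive a month column.
enriched = (
    orders.join(customers, on="customer_id", how="left")
          .withColumn("order_month", F.date_format(F.col("order_ts"), "yyyy-MM"))
)

# Load to the target destination, partitioned for downstream reads.
enriched.write.mode("overwrite").partitionBy("order_month").parquet(
    "s3://bucket/curated/orders_enriched/"
)

spark.stop()
```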